When Your Therapist Uses ChatGPT - AI Steps Into Mental Health

Posted on November 07, 2025 at 09:56 PM

Artificial intelligence is stepping into spaces once thought strictly human — including mental health care. According to a recent Washington Post article, some licensed therapists are now experimenting with AI-powered chatbots, such as ChatGPT, using them both personally and professionally — and occasionally recommending them to clients in a guided way. (washingtonpost.com)


Therapy Meets AI

Until recently, AI in therapy mostly raised alarms: chatbots misreading suicidal ideation, giving inappropriate advice, or misdiagnosing users. Yet some therapists are now cautiously embracing its potential.

  • Manhattan therapist Jack Worthy uses ChatGPT to analyze his dream journal, spotting stress patterns he might have missed.
  • Baltimore-based Nathalie Savell employs AI to support clients managing anxiety and relationship issues between sessions.
  • Both emphasize that AI supplements — it does not replace — human-led therapy.

Why AI Is Gaining Ground

Therapists cite several advantages:

  • Accessibility: AI is available 24/7, allowing reflection or journaling when a human therapist isn’t reachable.
  • Self-exploration: Chatbots can ask probing questions, helping users clarify thoughts and patterns.
  • Client trends: While overall use remains small, therapists report clients increasingly turning to AI between sessions.

Risks and Cautions

Despite its promise, AI in therapy has limits:

  • AI struggles with serious situations such as suicidal ideation, sometimes offering irrelevant or unhelpful responses.
  • Nonverbal cues, tone, and emotional “holding space” — central to therapy — are beyond AI’s capabilities.
  • Overreliance can encourage overthinking, with clients analyzing chatbot responses obsessively rather than processing them constructively.

The Emerging Hybrid Model

The article highlights a potential future where AI and human therapists work in tandem:

  • Hybrid care: AI provides reflection tools; therapists interpret and contextualize insights.
  • Ethical considerations: Clear boundaries, informed consent, and regulatory oversight remain crucial.
  • Empowered clients: Discussing AI-derived insights in therapy sessions can deepen understanding and engagement.

Practical Advice

  • Use AI for journaling and self-reflection, not as a standalone therapy.
  • Share AI-generated insights with a licensed therapist for context and guidance.
  • If crisis or suicidal thoughts arise, stop using chatbots and seek immediate professional help.
  • Ensure anyone guiding AI use has appropriate clinical expertise.

Glossary

  • LLM (Large Language Model): AI trained on vast text datasets to generate human-like responses. Used here to support introspection and guided self-analysis.
  • Generative AI: Technology capable of producing new content (text, images, responses) based on learned patterns.
  • Journaling: Writing down thoughts and feelings regularly; AI can guide prompts to enhance reflection (a brief sketch of what this can look like follows the glossary).
  • Supplemental therapy: Tools or practices used alongside primary therapy, not as replacements.
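
For readers curious what "AI-guided journaling prompts" can look like in practice, here is a minimal sketch using the OpenAI Python SDK. The model name, system prompt, and helper function are illustrative assumptions, not part of the article and not a recommendation of any specific product, and certainly not a substitute for therapy.

```python
# Illustrative sketch only: a journaling-prompt helper, not a therapy tool.
# Assumes the OpenAI Python SDK (`pip install openai`) and an OPENAI_API_KEY
# environment variable; the model name and prompts are placeholder choices.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a gentle journaling companion. Ask one open-ended, reflective "
    "question at a time. Do not diagnose or give clinical advice. If the user "
    "describes a crisis, encourage them to contact a professional or a crisis line."
)

def reflective_prompt(journal_entry: str) -> str:
    """Return a single open-ended question prompted by a journal entry."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat-capable model works
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": journal_entry},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    entry = "I felt anxious before a work meeting today and I'm not sure why."
    print(reflective_prompt(entry))
```

As the article stresses, anything a chatbot surfaces this way is raw material to bring to a licensed therapist, not a diagnosis, and it is not appropriate for crisis situations.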

Conclusion

AI chatbots are reshaping mental health care — not by replacing therapists, but by complementing them. With balance, human oversight, and clear boundaries, AI offers new ways for clients to reflect, process, and engage with their mental health.

Source: Washington Post